Incremental Feature Transformation for Temporal Space
Authors

Abstract
A temporal feature space generates features sequentially over consecutive time frames, cumulatively producing a very high-dimensional feature space, in contrast to spaces that generate samples over time. Pattern recognition applications over such a temporal feature space therefore have to withstand the complexities of waiting for the arrival of new features over time and handling ...
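The abstract's distinction can be made concrete: rather than rows (samples) accumulating over time, new feature *columns* arrive at each time frame. A minimal numpy sketch of that accumulation, with all names hypothetical:

```python
import numpy as np

# Hypothetical illustration: a feature matrix that grows column-wise as
# new features arrive at each time frame, while the sample count stays fixed.
class TemporalFeatureSpace:
    def __init__(self, n_samples):
        self.X = np.empty((n_samples, 0))

    def add_frame(self, new_features):
        # new_features: an (n_samples, k) block of features for this frame
        self.X = np.hstack([self.X, new_features])
        return self.X.shape[1]  # cumulative dimensionality so far

space = TemporalFeatureSpace(n_samples=100)
rng = np.random.default_rng(0)
for frame in range(5):
    dim = space.add_frame(rng.normal(size=(100, 20)))
print(dim)  # -> 100: five frames of 20 features each
```

Any method operating on this space must cope with the dimensionality growing without bound, which is the complexity the abstract refers to.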
Similar Resources

Incremental Feature Subsetting useful for Big Feature Space Problems
The dimensionality reduction process is, in general, a means of overcoming the curse of dimensionality. When all features are available together, it is a way to extract knowledge from a population in a big feature space. On the contrary, dimensionality reduction becomes intriguing when updates to the feature space are streaming, and the question arises whether one could reduce the feature space as and when the feature...
Incremental learning of feature space and classifier for face recognition
We have proposed a new approach to pattern recognition in which not only a classifier but also a feature space of input variables is learned incrementally. In this paper, an extended version of Incremental Principal Component Analysis (IPCA) and a Resource Allocating Network with Long-Term Memory (RAN-LTM) are effectively combined to implement this idea. Since IPCA updates a feature space increme...
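The core idea behind incremental PCA is that the feature space can be updated from accumulated sufficient statistics without revisiting old samples. A rough numpy-only sketch of that principle (not the paper's extended IPCA, which is only summarized here):

```python
import numpy as np

# Rough sketch: accumulate mean and scatter statistics chunk by chunk, then
# eigendecompose to obtain the updated principal feature space on demand.
class SimpleIPCA:
    def __init__(self, dim):
        self.n = 0
        self.sum = np.zeros(dim)
        self.scatter = np.zeros((dim, dim))

    def partial_fit(self, X):
        self.n += X.shape[0]
        self.sum += X.sum(axis=0)
        self.scatter += X.T @ X

    def components(self, k):
        mean = self.sum / self.n
        cov = self.scatter / self.n - np.outer(mean, mean)
        vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
        return vecs[:, ::-1][:, :k]           # top-k principal directions

rng = np.random.default_rng(2)
ipca = SimpleIPCA(dim=4)
for _ in range(10):                           # data arrives in chunks
    ipca.partial_fit(rng.normal(size=(30, 4)) * [5, 1, 1, 1])
W = ipca.components(k=1)
print(W.shape)  # -> (4, 1); the high-variance first axis dominates
```

The classifier side (RAN-LTM in the paper) would then be retrained or adapted in the projected space whenever the components change.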
Case Retrieval Using Nonlinear Feature-Space Transformation
Good similarity functions are at the heart of effective case-based reasoning. However, the similarity functions designed so far have been mostly linear, weighted-sum in nature. In this paper, we explore how to handle case retrieval when the case base is nonlinear in similarity measurement, in which case linear similarity functions will result in wrong solutions. Our ...
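To make the linear-versus-nonlinear contrast concrete: a nonlinear similarity can be obtained by measuring cases through a kernel rather than a weighted sum. A hedged illustration using an RBF kernel (one common nonlinear choice, not necessarily the transformation this paper proposes):

```python
import numpy as np

# Illustrative sketch: retrieve the most similar case under a nonlinear
# (RBF kernel) similarity instead of a linear weighted sum.
def rbf_similarity(query, case_base, gamma=1.0):
    d2 = ((case_base - query) ** 2).sum(axis=1)   # squared distances
    return np.exp(-gamma * d2)                    # similarity in (0, 1]

case_base = np.array([[0.0, 0.0],
                      [1.0, 1.0],
                      [5.0, 5.0]])
query = np.array([0.9, 1.1])
sims = rbf_similarity(query, case_base)
best = int(np.argmax(sims))
print(best)  # -> 1: the case at (1, 1) is retrieved
```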
Feature Space Transformation using Equation Discovery
In machine learning, the success and performance of learning often depend critically on the feature space provided. For instance, when learning a classifier f1, ..., fn → c that maps features f1, ..., fn to classes c, appropriate encodings of f1, ..., fn are often as important as the choice of learning algorithm itself. This is particularly true for robot learning, where the feature space...
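The point about encodings mattering as much as the algorithm can be illustrated by expanding raw features with discovered arithmetic combinations, so that a relation such as area = width × height becomes linearly learnable. A minimal sketch under that assumption (the derived terms here are illustrative, not the paper's discovered equations):

```python
import numpy as np

# Illustrative sketch: augment two raw features with a product and a ratio
# term, a tiny stand-in for equation-discovery-style feature construction.
def expand_features(X):
    f1, f2 = X[:, 0], X[:, 1]
    return np.column_stack([f1, f2, f1 * f2, f1 / (f2 + 1e-9)])

X = np.array([[2.0, 3.0],
              [4.0, 5.0]])
Z = expand_features(X)
print(Z.shape)  # -> (2, 4): raw features plus product and ratio terms
```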
Journal

Journal title: International Journal of Computer Applications

Year: 2016
ISSN: 0975-8887
DOI: 10.5120/ijca2016910737